A fully adaptive normalized nonlinear gradient descent algorithm for complex-valued nonlinear adaptive filters

Authors

  • Andrew I. Hanna
  • Danilo P. Mandic
Abstract

A fully adaptive normalized nonlinear complex-valued gradient descent (FANNCGD) learning algorithm for training nonlinear (neural) adaptive finite impulse response (FIR) filters is derived. First, a normalized nonlinear complex-valued gradient descent (NNCGD) algorithm is introduced. For rigour, the remainder of the Taylor series expansion of the instantaneous output error in the derivation of NNCGD is made adaptive at every discrete time instant using a gradient-based approach. This results in the fully adaptive normalized nonlinear complex-valued gradient descent learning algorithm that is suitable for nonlinear complex adaptive filtering with a general holomorphic activation function and is robust to the initial conditions. Convergence analysis of the proposed algorithm is provided both analytically and experimentally. Experimental results on the prediction of colored and nonlinear inputs show the FANNCGD outperforming other algorithms of this kind.
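Since the abstract only outlines the algorithm, the following is a minimal sketch of the general NNCGD/FANNCGD structure it describes: a single nonlinear complex-valued FIR neuron trained with a normalized complex gradient step, in which the term C(k) standing in for the Taylor-series remainder is itself adapted at every time instant. The tanh activation, the step sizes, and the exact gradient update for C(k) are assumptions reconstructed from the description, not the authors' derivation.

```python
import numpy as np

# Sketch of the NNCGD/FANNCGD structure described in the abstract: a nonlinear
# complex-valued FIR neuron y(k) = Phi(x(k)^T w(k)) trained with a normalized
# complex gradient step, where the remainder term C(k) is itself adapted at
# every time instant.  The activation and the C(k) update are assumptions.

def fanncgd_predict(x, d, order=4, mu_c=0.15, c0=1.0, eps=1e-8):
    """One-step prediction of the complex target d from past samples of x
    (x and d are complex numpy arrays of equal length)."""
    phi = np.tanh                                   # holomorphic activation (assumed)
    dphi = lambda z: 1.0 - np.tanh(z) ** 2
    w = np.zeros(order, dtype=complex)
    c = c0                                          # adaptive remainder term C(k)
    y = np.zeros_like(d, dtype=complex)
    prev = None                                     # quantities kept from step k-1
    for k in range(order, len(x)):
        u = x[k - order:k][::-1]                    # tap-input vector x(k)
        net = np.dot(u, w)
        y[k] = phi(net)
        e = d[k] - y[k]
        g = dphi(net)
        eta = 1.0 / (c + abs(g) ** 2 * np.vdot(u, u).real + eps)  # normalized step
        if prev is not None:
            # gradient adaptation of C(k): chain rule through the previous
            # weight update (form and sign are an assumption, see note above)
            e1, g1, u1, eta1 = prev
            grad_c = 2.0 * np.real(np.conj(e) * g * e1 * np.conj(g1)
                                   * np.dot(u, np.conj(u1))) * eta1 ** 2
            c = max(eps, c - mu_c * grad_c)         # keep C(k) positive (assumption)
        w = w + eta * e * np.conj(g) * np.conj(u)   # CNGD-type weight update
        prev = (e, g, u, eta)
    return y, w
```

For one-step-ahead prediction of a complex signal x, a call such as fanncgd_predict(x, x, order=4) uses the past four samples to predict the current one.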

Similar articles

A fully adaptive normalized nonlinear gradient descent algorithm for nonlinear system identification

A fully adaptive normalized nonlinear gradient descent (FANNGD) algorithm for neural adaptive filters employed for nonlinear system identification is proposed. This full adaptation is achieved using the instantaneous squared prediction error to adapt the free parameter of the NNGD algorithm. The convergence analysis of the proposed algorithm is undertaken using the contractivity property of the non...

A Nonlinear Neural FIR Filter With An Adaptive Activation Function

An adaptive amplitude normalized nonlinear gradient descent (AANNGD) algorithm for the class of nonlinear finite impulse response (FIR) adaptive filters (dynamical perceptron) is introduced. This is achieved by making the amplitude of the nonlinear activation function gradient adaptive. The proposed learning algorithm is suitable for the processing of nonlinear and nonstationary signals with a larg...

A complex-valued nonlinear neural adaptive filter with a gradient adaptive amplitude of the activation function

A complex-valued nonlinear gradient descent (CNGD) learning algorithm for a simple finite impulse response (FIR) nonlinear neural adaptive filter with an adaptive amplitude of the complex activation function is proposed. In this way, the amplitude of the complex-valued analytic nonlinear activation function of a neuron in the learning algorithm is made gradient adaptive to give the complex-valued a...
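The excerpt above describes making the amplitude of the complex activation function gradient adaptive. The sketch below illustrates that general idea under assumptions: a real amplitude lambda(k) scaling a tanh activation, updated by stochastic gradient descent on |e(k)|^2 alongside a CNGD-type weight update. It is not the exact algorithm of the cited paper.

```python
import numpy as np

# Sketch of a gradient-adaptive amplitude: the neuron output is
# y(k) = lambda(k) * Phi(net(k)), with a real amplitude lambda(k) updated by
# stochastic gradient descent on |e(k)|^2 next to a CNGD-type weight update.
# The activation and step sizes are illustrative assumptions.

def cngd_adaptive_amplitude(x, d, order=4, mu=0.05, rho=0.02, lam0=1.0):
    phi = np.tanh
    dphi = lambda z: 1.0 - np.tanh(z) ** 2
    w = np.zeros(order, dtype=complex)
    lam = lam0                                      # adaptive real amplitude
    y = np.zeros_like(d, dtype=complex)
    for k in range(order, len(x)):
        u = x[k - order:k][::-1]
        net = np.dot(u, w)
        y[k] = lam * phi(net)
        e = d[k] - y[k]
        w = w + mu * lam * e * np.conj(dphi(net)) * np.conj(u)  # CNGD weight step
        lam = lam + rho * np.real(np.conj(e) * phi(net))        # amplitude step
    return y, w, lam
```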

A Complex-Valued RTRL Algorithm for Recurrent Neural Networks

A complex-valued real-time recurrent learning (CRTRL) algorithm for the class of nonlinear adaptive filters realized as fully connected recurrent neural networks is introduced. The proposed CRTRL is derived for a general complex activation function of a neuron, which makes it suitable for nonlinear adaptive filtering of complex-valued nonlinear and nonstationary signals and complex signals with...
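A full CRTRL for a fully connected recurrent network is beyond a short excerpt; the sketch below illustrates the same sensitivity-propagation idea for a single recurrent neuron with a holomorphic tanh activation. All parameter choices are assumptions, not the cited derivation.

```python
import numpy as np

# Sketch of the complex RTRL idea, simplified to a single recurrent neuron
# whose output is fed back as an extra input.  pi holds the sensitivities
# dy(k)/dw, propagated recursively through the feedback connection.

def crtrl_single_neuron(x, d, order=4, mu=0.02):
    phi = np.tanh
    dphi = lambda z: 1.0 - np.tanh(z) ** 2
    n_w = order + 2                                  # taps + bias + feedback weight
    w = np.zeros(n_w, dtype=complex)
    pi = np.zeros(n_w, dtype=complex)                # sensitivities dy/dw_i
    y_prev = 0.0 + 0.0j
    y = np.zeros_like(d, dtype=complex)
    for k in range(order, len(x)):
        u = np.concatenate([x[k - order:k][::-1], [1.0, y_prev]])
        net = np.dot(u, w)
        y[k] = phi(net)
        e = d[k] - y[k]
        # recursive sensitivity update: only the feedback input depends on w
        pi = dphi(net) * (u + w[-1] * pi)
        w = w + mu * e * np.conj(pi)                 # complex RTRL weight step
        y_prev = y[k]
    return y, w
```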

Utilizing Kernel Adaptive Filters for Speech Enhancement within the ALE Framework

The performance of linear models, widely used within the framework of adaptive line enhancement (ALE), deteriorates dramatically in the presence of non-Gaussian noise. On the other hand, adaptive implementations of nonlinear models, e.g. Volterra filters, suffer from the severe problems of a large number of parameters and slow convergence. Nonetheless, kernel methods are emerging solutions t...
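As a rough illustration of the kernel route mentioned above, the sketch below applies kernel LMS with a Gaussian kernel in an ALE configuration: a delayed window of the noisy signal predicts the current sample, so the output tracks the narrowband (predictable) component. The kernel width, step size, and naive growing dictionary are illustrative assumptions, not the method of the cited paper.

```python
import numpy as np

# Kernel LMS used as an adaptive line enhancer (ALE): the regressor is a
# delayed window of the real-valued noisy signal and the filter predicts the
# current sample.  All parameters are illustrative assumptions.

def klms_ale(x, order=8, delay=1, mu=0.2, sigma=1.0):
    def kernel(u, centers):
        d2 = np.sum(np.abs(centers - u) ** 2, axis=1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    centers, coeffs = [], []                       # growing kernel expansion
    y = np.zeros(len(x))
    for k in range(order + delay, len(x)):
        u = x[k - delay - order + 1:k - delay + 1][::-1]   # delayed regressor
        if centers:
            y[k] = np.dot(coeffs, kernel(u, np.asarray(centers)))
        e = x[k] - y[k]                            # prediction error
        centers.append(u)                          # KLMS: add a new center...
        coeffs.append(mu * e)                      # ...weighted by mu * e
    return y                                       # enhanced (narrowband) output
```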

Journal:
  • IEEE Trans. Signal Processing

Volume: 51  Issue: 

Pages: -

Publication date: 2003